
Lilien, Gary L.

Topic Weight Topic Terms
0.418 support decision dss systems guidance process making environments decisional users features capabilities provide decision-making user
0.357 decision making decisions decision-making makers use quality improve performance managers process better results time managerial
0.206 feedback mechanisms mechanism ratings efficiency role effective study economic design potential economics discuss profile recent
0.132 learning mental conceptual new learn situated development working assumptions improve ess existing investigates capture advanced
0.125 options real investment option investments model valuation technology value analysis uncertainty portfolio models using context
0.123 research study influence effects literature theoretical use understanding theory using impact behavior insights examine influences
0.111 effort users advice ras trade-off recommendation agents difficulty decision make acceptance product loss trade-offs context
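For comparing the topics above, a minimal sketch (Python) that converts the raw topic weights from the table into relative shares; the weights are copied from the "Topic Weight" column, and the short topic labels are paraphrases I assigned for readability, not labels from the profile itself.

    # Sketch: normalize the raw topic weights above into relative shares.
    # Weights come from the "Topic Weight" column; labels are shortened paraphrases.
    weights = {
        "decision support systems": 0.418,
        "decision making / decision quality": 0.357,
        "feedback mechanisms": 0.206,
        "learning / mental models": 0.132,
        "real options / valuation": 0.125,
        "research / theory / behavior": 0.123,
        "effort / recommendation agents": 0.111,
    }

    total = sum(weights.values())
    for topic, w in sorted(weights.items(), key=lambda kv: kv[1], reverse=True):
        print(f"{topic:36s} weight={w:.3f} share={w / total:.1%}")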

Co-authorship network (focal researcher, 1st-degree coauthors, 2nd-degree coauthors); counts below are numbers of co-authored articles.

Co-authors: Rangaswamy, Arvind (2); van Bruggen, Gerrit H. (2); De Bruyn, Arnaud (1); Kayande, Ujwal (1); Starke, Katrin (1)
Keywords (number of articles): decision process (1); decision quality (1); DSS (1); decision support systems (1); evaluations (1); feedback (1); learning (1); marketing models (1); mental models (1); resource allocation (1)

Articles (2)

How Incorporating Feedback Mechanisms in a DSS Affects DSS Evaluations. (Information Systems Research, 2009)
Authors: Ujwal Kayande, Arnaud De Bruyn, Gary L. Lilien, Arvind Rangaswamy, Gerrit H. van Bruggen
Abstract:
    Model-based decision support systems (DSS) improve performance in many contexts that are data-rich and uncertain and that require repetitive decisions. But such DSS are often not designed to help users understand and internalize the underlying factors driving DSS recommendations. Users then feel uncertain about DSS recommendations, which may lead them to avoid using the system. We argue that a DSS must be designed to induce an alignment of a decision maker's mental model with the decision model embedded in the DSS. Such an alignment requires effort from the decision maker and guidance from the DSS. We experimentally evaluate two DSS design characteristics that facilitate such alignment: (i) feedback on the upside potential for performance improvement and (ii) feedback on corrective actions to improve decisions. We show that, in tandem, these two types of DSS feedback induce decision makers to align their mental models with the decision model, a process we call deep learning, whereas individually these two types of feedback have little effect on deep learning. We also show that deep learning, in turn, improves user evaluations of the DSS. We discuss how our findings could lead to DSS design improvements and better returns on DSS investments.
DSS Effectiveness in Marketing Resource Allocation Decisions: Reality vs. Perception. (Information Systems Research, 2004)
Authors: Gary L. Lilien, Arvind Rangaswamy, Gerrit H. van Bruggen, Katrin Starke
Abstract:
    We study the process by which model-based decision support systems (DSSs) influence managerial decision making in the context of marketing budgeting and resource allocation. We focus on identifying whether and how DSSs influence the decision process (e.g., cognitive effort deployed, discussion quality, and decision alternatives considered) and, as a result, how these DSSs influence decision outcomes (e.g., profit and satisfaction both with the decision process and the outcome). We study two specific marketing resource allocation decisions in a laboratory context: sales effort allocation and customer targeting. We find that decision makers who use high-quality, model-based DSSs make objectively better decisions than do decision makers who only have access to a generic decision tool (Microsoft Excel). However, their subjective evaluations (perceptions) of both their decisions and the processes that lead to those decisions do not necessarily improve as a result of DSS use. Moreover, expert judges, serving as surrogates for top management, have a difficult time assessing the objective quality of those decisions. Our results suggest that what managers get from a high-quality DSS may be substantially better than what they see. To increase the inclination for managerial adoption and use of DSSs, we must get users to "see" the benefits of using a DSS. Our results also suggest two ways to bridge the perception-reality gap: (1) improve the perceived value of the decision process by designing DSSs both to encourage discussion (e.g., by providing explanation and support for alternative recommendations) and to reduce the perceived complexity of the problem so that managers invest more cognitive effort in exploring additional options, and (2) provide feedback on the likely market/business outcomes of various decision options.